Current Issue: July-September | Volume: 2025 | Issue Number: 3 | Articles: 5
With the rapid development of computer technology, cloud computing has been widely adopted across industries, but the accompanying network security problems have become increasingly prominent. This paper introduces data encryption technology to address the security challenges in cloud computing. First, the basic characteristics of cloud computing are analyzed and its security vulnerabilities are discussed. In light of these issues, the study examines the role of data encryption in cloud computing and analyzes in depth the applications and solutions of symmetric and asymmetric encryption, link encryption, end-to-end encryption, and node encryption, in order to provide a reference for improving data security in cloud computing environments. The proposed data encryption scheme not only ensures data security but also effectively reduces the impact of encryption on cloud computing performance, demonstrating high practicality and feasibility. The research offers fresh concepts and techniques for the field of cloud computing security and is of great significance for promoting the wide application of cloud computing technology.
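The symmetric/asymmetric distinction the abstract draws can be illustrated with a minimal sketch. This is not the paper's scheme: the XOR stream is a deliberately insecure toy standing in for a real symmetric cipher such as AES, and the tiny-prime RSA below it is the standard textbook example of asymmetric encryption, usable only for illustration.

```python
import os

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR each byte with a repeating key.
    # The SAME key encrypts and decrypts -- that is the defining
    # property of symmetric encryption. Not secure; use AES in practice.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)
ciphertext = xor_stream(b"cloud record", key)
assert xor_stream(ciphertext, key) == b"cloud record"

# Textbook RSA with tiny primes to illustrate asymmetric encryption:
# DIFFERENT keys encrypt (public) and decrypt (private).
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e
m = 65                    # message encoded as an integer < n
c = pow(m, e, n)          # encrypt with the public key (e, n)
assert pow(c, d, n) == m  # decrypt with the private key (d, n)
```

Link, end-to-end, and node encryption differ not in the cipher used but in where along the communication path these operations are applied.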
In recent years, with the continuous development of cloud computing technology, the importance of the cloud computing data center, as the infrastructure underpinning cloud computing, has become increasingly prominent. Cloud computing data centers must meet requirements for high efficiency, high availability, scalability, and security to ensure the stable operation of cloud services and a good user experience. This paper first introduces the basic concepts and development status of cloud computing data centers, then analyzes the key elements and objectives of data center architecture design, and finally proposes an efficient architecture design scheme whose effectiveness and performance are verified through experiments. The experimental results show that the scheme improves the resource utilization rate of the data center, reduces energy consumption, and enhances system reliability and scalability, giving it practical application value.
Accurate long-term cloud demand forecasting is critical for optimizing resource procurement and cost management in cloud computing, yet it remains challenging due to dynamic demand trends, limited historical data, and the poor generalization of existing models in few-shot scenarios. This paper proposes DimAug-TimesFM, a dimension-augmented framework for long-term cloud demand forecasting, which addresses these challenges through two key innovations. First, Delivery Period Extracting identifies critical resource delivery phases by analyzing smoothed utilization trends and differencing thresholds, enabling focused modeling on periods reflecting actual demand. Second, Dimension-Augmented TimesFM enhances the pretrained TimesFM model by integrating cross-pool data via Dynamic Time Warping (DTW)-based similarity matching, enriching training data while mitigating distribution discrepancies. Experiments on real-world cloud resource utilization data demonstrate that DimAug-TimesFM significantly outperforms SOTA baselines (e.g., TimesFM, DLinear, PatchTST) in both short-term (16-day) and long-term (64-day and 128-day) forecasting tasks, achieving average RMSE reductions of 72.9-81.7%. DimAug-TimesFM also exhibits better robustness in scenarios where TimesFM fails, attributed to its synergistic integration of temporal feature enhancement and cross-pool data augmentation. This work provides a practical solution for few-shot cloud demand forecasting, enabling enterprises to align resource allocation with dynamic usage patterns and reduce operational costs.
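The Dynamic Time Warping similarity matching mentioned above can be sketched with the classic dynamic-programming recurrence. This is a generic DTW distance, not the paper's implementation; the pool series and threshold semantics here are illustrative assumptions.

```python
def dtw_distance(a, b):
    # Classic O(n*m) DTW: D[i][j] = local cost + min of the three
    # predecessors (insertion, deletion, match), allowing one series
    # to stretch or compress in time relative to the other.
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

pool_a = [1.0, 2.0, 3.0, 4.0]        # utilization trend of the target pool
pool_b = [1.0, 1.0, 2.0, 3.0, 4.0]   # same shape, time-shifted
pool_c = [9.0, 9.0, 9.0, 9.0]        # dissimilar pool
# The time-shifted pool is far closer under DTW, so it would be
# selected for cross-pool augmentation; the dissimilar one would not.
assert dtw_distance(pool_a, pool_b) < dtw_distance(pool_a, pool_c)
```

DTW's tolerance to temporal misalignment is what makes it suitable here: two resource pools with similar demand shapes but shifted delivery schedules still score as similar.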
Human tracking is a fundamental technology for mobile robots that work with humans. Various devices are used to observe humans, such as cameras, RGB-D sensors, millimeter-wave radars, and laser range finders (LRFs). A typical LRF observes only the surroundings on a particular horizontal plane. Human recognition using an LRF has a low computational load and is suitable for mobile robots; however, it is vulnerable to variations in human height, potentially leading to detection failures for individuals taller or shorter than the standard height. This work aims to develop a method that is robust to height differences among humans using a 3D LiDAR. We observed the environment using a 3D LiDAR and projected the point cloud onto a single horizontal plane to apply a human-tracking method designed for 2D LRFs. We investigated the optimal height range of the point clouds for projection and found that using the top 30% of the measured person's point cloud provided the most stable tracking. Path-following experiments revealed that the proposed method reduced the proportion of outlier points compared to projecting all points (from 3.63% to 1.75%). As a result, the proposed method was effective in achieving robust human following.
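The core projection step described above can be sketched as follows. This is a simplified reading of the idea, not the authors' code: it assumes the "top 30%" refers to the upper 30% of the person's height range, and the point format and function name are illustrative.

```python
def project_top_fraction(points, fraction=0.30):
    # points: iterable of (x, y, z) for one person's cluster.
    # Keep only points in the top `fraction` of the cluster's height
    # range, then drop z to obtain 2D points a 2D-LRF tracker can use.
    zs = sorted(p[2] for p in points)
    z_min, z_max = zs[0], zs[-1]
    z_cut = z_max - fraction * (z_max - z_min)  # lower bound of top band
    return [(x, y) for (x, y, z) in points if z >= z_cut]

# Synthetic person cluster: points evenly spaced from 0.0 m to 1.8 m.
person = [(0.1, 0.2, z / 10.0) for z in range(0, 19)]
top = project_top_fraction(person, 0.30)   # 6 of 19 points survive
```

Because the cut is relative to each person's own height range rather than a fixed plane, the projection adapts automatically to taller or shorter individuals, which is the robustness property the abstract claims.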
3D point clouds primarily consist of irregular, sparse data dominated by background elements. The inherent irregularity of 3D point clouds induces elevated data movement, while the predominance of background points significantly amplifies computational requirements. Inspired by the substantial overlap of background points in adjacent frames, we introduce a pruning technique that exploits temporal correlations across successive frames to reduce redundant computations and expedite inference. This methodology directs computational resources toward valuable, highly correlated data rather than indiscriminately processing entire point clouds. To further accelerate performance, we optimize data movement using Single Instruction Multiple Data (SIMD) techniques, targeting the time-intensive Gather and Scatter operations within the dataflow. We compare our method with the state-of-the-art sparse inference engine TorchSparse 2.0 and show that it achieves a 1.2× speedup for MinkUnet and SPVCNN without significant accuracy loss. In particular, our SIMD-based data movement achieves more than a 5× speedup.
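The frame-to-frame pruning idea can be sketched at the voxel level. This is a conceptual illustration, not the paper's engine: voxelizing both frames and set-differencing them identifies which voxels are new (must be computed) versus overlapping background (whose results could be reused). The function names, voxel size, and caching policy are assumptions for the sketch.

```python
def voxelize(points, size=0.2):
    # Quantize continuous (x, y, z) coordinates to integer voxel indices.
    return {(int(x // size), int(y // size), int(z // size))
            for x, y, z in points}

def split_by_temporal_overlap(prev_frame, curr_frame, size=0.2):
    # Voxels absent from the previous frame need fresh computation;
    # voxels present in both frames (mostly static background) are
    # candidates for reusing cached results, pruning redundant work.
    prev_v = voxelize(prev_frame, size)
    curr_v = voxelize(curr_frame, size)
    return curr_v - prev_v, curr_v & prev_v   # (new, reusable)

frame0 = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0)]                   # background
frame1 = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 0.5, 0.1)]  # + one new point
new, reused = split_by_temporal_overlap(frame0, frame1)
# Only the single new voxel needs processing; the two overlapping
# background voxels are pruned from the compute path.
```

In a real LiDAR sequence the background typically dominates both frames, so the `reused` set is large and the savings are substantial; the SIMD optimization then accelerates the Gather/Scatter traffic for the points that do get processed.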